Conversation

@github-actions github-actions bot commented Oct 5, 2025

This PR was automatically generated by GitHub Actions.

@sd109 sd109 closed this Oct 6, 2025
@sd109 sd109 reopened this Oct 6, 2025
@sd109 sd109 force-pushed the update/vllm-v0.11.0 branch 2 times, most recently from 75413b3 to ad2617b on October 24, 2025 13:59
@github-actions github-actions bot force-pushed the update/vllm-v0.11.0 branch from 6f7cb57 to a057e86 on October 26, 2025 09:04
@sd109 sd109 force-pushed the update/vllm-v0.11.0 branch from a057e86 to 42564e1 on October 26, 2025 21:16
sd109 and others added 2 commits October 28, 2025 08:14
There's some kind of incompatibility between the GitHub runner
environment and vLLM v0.11.0's CPU image. During the testing process
the vLLM API pod starts, logs some messages about 'Automatically
detected platform CPU' and then seemingly gets killed and enters a
crash loop. Enabling debug logging on vLLM with
VLLM_LOGGING_LEVEL=DEBUG doesn't provide any useful clues, and the
same vLLM CPU tests work fine on a standard Ubuntu 24.04 VM outside
of GitHub Actions. Disabling this CI test for now. TODO: Try
re-enabling this when a newer vLLM version is available.
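
For reference, VLLM_LOGGING_LEVEL is vLLM's own environment variable for controlling log verbosity; in the chart it would be set on the API pod's environment. A minimal sketch of using it outside the chart, assuming a plain Python entrypoint and a small illustrative model rather than the containerised deployment exercised by this CI job:

```python
# Minimal sketch: raise vLLM's log verbosity via its VLLM_LOGGING_LEVEL
# environment variable. The value is read when vllm is imported, so it
# must be set beforehand (the chart sets it on the API pod instead).
import os

os.environ["VLLM_LOGGING_LEVEL"] = "DEBUG"

from vllm import LLM  # import deliberately after the env var is set

# Hypothetical small model, used purely for illustration.
llm = LLM(model="facebook/opt-125m")
print(llm.generate(["Hello"])[0].outputs[0].text)
```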
@sd109 sd109 force-pushed the update/vllm-v0.11.0 branch from 7adad46 to 89cb1e2 on October 28, 2025 08:14
@sd109 sd109 merged commit 414a364 into main Oct 28, 2025
7 checks passed
@sd109 sd109 deleted the update/vllm-v0.11.0 branch October 28, 2025 09:21